17 - Variational neural annealing (Estelle Inack, Perimeter Institute for Theoretical Physics) [ID:26727]

Welcome everyone to the CAS seminar. We have Dr. Estelle Inack from the Perimeter Institute

for Theoretical Physics, who did her bachelor's and master's studies at the University of Buea

in Cameroon. So we were actually together at university in Cameroon, and she moved to the

International Centre for Theoretical Physics (ICTP) in Italy and the International School for Advanced Studies (SISSA),

where she did her PhD, and now she has moved to the Perimeter Institute for a postdoc.

And today she will be speaking about variational neural annealing.

All right, thanks a lot Marius. It's a pleasure to be here and to be invited especially by you.

And so I'm very pleased to be here and thank you for the opportunity that you've given me to present

a very recent work that I've done this year. So basically the paper is not yet out, hopefully it will be before the end of the year, but there is a short version of the paper that we've published at NeurIPS, the international conference on machine learning and artificial intelligence. There is a workshop

that is specialized in machine learning for physics. So basically you can have the short

version of the paper. All right, I will start. So basically what we are trying to tackle or to solve

are optimization problems. Optimization problems are ubiquitous. They occur in different areas of science and industry. So here I've just given a couple of examples of where we can find

optimization problems. The first one is the so-called traveling salesman problem.

So it's the problem of a traveler who goes through a certain number of cities once and comes back to the city he started from. And the aim is to find the shortest path through all the cities that he's visiting. But when you increase the number of cities, the total number of possible paths that he can take blows up exponentially.

So actually, in complexity theory, it's known as an NP-complete problem.
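To make the exponential blow-up concrete, here is a minimal brute-force sketch, not from the talk, with a made-up distance matrix: checking every tour is fine for five cities, but the number of distinct tours from a fixed start grows as (n-1)!.

```python
import math
from itertools import permutations

# Toy symmetric distance matrix for 5 cities (made-up values).
D = [
    [0, 2, 9, 10, 7],
    [2, 0, 6, 4, 3],
    [9, 6, 0, 8, 5],
    [10, 4, 8, 0, 6],
    [7, 3, 5, 6, 0],
]

def brute_force_tsp(D):
    """Try every tour starting and ending at city 0; keep the shortest."""
    n = len(D)
    best_len, best_tour = float("inf"), None
    for perm in permutations(range(1, n)):
        tour = (0,) + perm + (0,)
        length = sum(D[a][b] for a, b in zip(tour, tour[1:]))
        if length < best_len:
            best_len, best_tour = length, tour
    return best_len, best_tour

best_len, best_tour = brute_force_tsp(D)
print(best_len, best_tour)
# (n-1)! tours to check: for n = 20 that is already 19! ≈ 1.2e17.
print(math.factorial(19))
```

For five cities there are only 24 tours to enumerate; the same loop is hopeless already at a few dozen cities, which is exactly the exponential blow-up mentioned above.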

All right. Another problem is more in the quantum chemistry side of the thing. So imagine that you

have a cluster of atoms that interact with some kind of potential, let's say. And here, say,

you have the free energy or the potential energy surface of your system. Each minimum describes a state of the system where it has some sort of stability. So you could imagine that the lowest minimum gives the configuration of the system which is the most stable one. And this optimization problem is actually about finding the configuration that minimizes this kind of potential energy

landscape. Another optimization problem is the so-called protein folding problem.

That is the problem where, given an unfolded state of a protein in which it is not functional, you want to know the native state of the protein. So here you have a diagram of energy with respect to the configuration, or conformation, of the protein. This energy landscape usually has an exponential number of local minima. And finding the native state of the protein means finding the lowest minimum, the configuration in which the protein is actually functional, which is the native

state of the protein. So this is another kind of optimization problem. And then the last one I

gave as an example here is the so-called portfolio optimization problem. That is the problem of

finding the best way to basically manage the assets you have in your portfolio.
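The common thread of these examples, minimizing an objective over a rugged landscape with many local minima, can be illustrated with a toy one-dimensional energy function (purely illustrative, not any of the specific problems above):

```python
import numpy as np

# Toy 1-D "energy landscape": a rugged function with several local minima.
x = np.linspace(0.0, 10.0, 2001)
E = np.sin(3.0 * x) + 0.1 * (x - 5.0) ** 2

# A grid point is a local minimum if it lies below both of its neighbours.
interior = (E[1:-1] < E[:-2]) & (E[1:-1] < E[2:])
minima_x = x[1:-1][interior]

# The global minimum is the deepest of the local minima; a greedy descent
# only reaches the nearest one, which is why these problems are hard.
x_star = x[np.argmin(E)]
print(len(minima_x), x_star)
```

The grid scan finds several local minima but only one of them is the global one; in the real problems the configuration space is exponentially large, so such exhaustive scanning is impossible.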

And there are many more examples like these. So why are those problems interesting for physicists like me?

Especially the ones... is somebody speaking? I think somebody has their mic on. Maybe you could mute yourself.

So basically the reason why these problems are interesting, especially the ones that do not seem to be directly related to physics, like this one, is the fact that they can be cast into a form that we as physicists are most familiar with, which is the form of a classical Hamiltonian. So here you could imagine you have an interacting spin system. Imagine that each variable is a Pauli z matrix, let's say, whose eigenstates are spin up and spin down. And then the optimization problem you're interested in solving is actually encoded into this coupling matrix that tells you how, let's say, the spins or the different constituents of your system interact. And solving the optimization problem is equivalent to finding the ground state of this classical Hamiltonian. So in this sense, we can use the techniques that we've

developed over the years in statistical physics or in condensed matter in order to find what is the

ground state of this classical problem. But there is a catch. The catch is that for the

hardest of these problems, here I just gave an example of a glassy system in glassy physics,
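The spin encoding described above can be sketched in a few lines. This is a toy illustration, not the method from the talk: the problem is stored in a coupling matrix J (random values here), spins take values ±1, the energy is E(s) = Σ_{i<j} J_ij s_i s_j, and the solution is the spin configuration of lowest energy, the classical ground state. Exhaustive search over all 2^n configurations is feasible only for tiny n; the exponential blow-up is exactly why the hard (e.g. glassy) instances need methods like annealing.

```python
import numpy as np
from itertools import product

# Random couplings J[i, j] for i < j encode a toy optimization problem.
rng = np.random.default_rng(0)
n = 10
J = np.triu(rng.normal(size=(n, n)), k=1)

def energy(s, J):
    """Classical Ising-type energy: sum over i < j of J[i,j] * s_i * s_j."""
    s = np.asarray(s)
    return s @ J @ s  # upper-triangular J restricts the sum to i < j

# Brute-force ground-state search over all 2^n spin configurations.
best = min(product([-1, 1], repeat=n), key=lambda s: energy(s, J))
print(energy(best, J))
```

Note the global spin-flip symmetry of this energy: flipping every spin leaves E unchanged, so the ground state always comes in a degenerate pair.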

Part of a video series:
Access: Open access
Duration: 01:10:34 min
Recording date: 2020-12-16
Uploaded on: 2020-12-16 21:39:40
Language: en-US
